
    Learning Deep Similarity Metric for 3D MR-TRUS Registration

    Purpose: The fusion of transrectal ultrasound (TRUS) and magnetic resonance (MR) images for guiding targeted prostate biopsy has significantly improved the biopsy yield of aggressive cancers. A key component of MR-TRUS fusion is image registration. However, it is very challenging to obtain a robust automatic MR-TRUS registration due to the large appearance difference between the two imaging modalities. The work presented in this paper aims to tackle this problem by addressing two challenges: (i) the definition of a suitable similarity metric and (ii) the determination of a suitable optimization strategy. Methods: This work proposes the use of a deep convolutional neural network to learn a similarity metric for MR-TRUS registration. We also use a composite optimization strategy that explores the solution space in order to search for a suitable initialization for the second-order optimization of the learned metric. Further, a multi-pass approach is used in order to smooth the metric for optimization. Results: The learned similarity metric outperforms the classical mutual information and also the state-of-the-art MIND feature-based methods. The results indicate that the overall registration framework has a large capture range. The proposed deep similarity metric based approach obtained a mean TRE of 3.86 mm (with an initial TRE of 16 mm) for this challenging problem. Conclusion: A similarity metric that is learned using a deep neural network can be used to assess the quality of any given image registration and can be used in conjunction with the aforementioned optimization framework to perform automatic registration that is robust to poor initialization. Comment: To appear in IJCARS
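
    The abstract describes the learned metric only at a high level. As a rough illustration of the idea, the sketch below shows one way a small 3D CNN could score the alignment of an MR/TRUS patch pair so that the score can be fed to a registration optimizer. The architecture, layer sizes, and the SimilarityNet name are assumptions made for this example, not the authors' network.

    ```python
    # Minimal sketch (an assumption, not the authors' code) of a CNN similarity
    # metric: it takes an MR patch and the corresponding resampled TRUS patch
    # and predicts a scalar alignment score (higher = better registration).
    import torch
    import torch.nn as nn

    class SimilarityNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
                nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool3d(2),
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool3d(1), nn.Flatten(),
                nn.Linear(32, 1),
            )

        def forward(self, mr_patch, trus_patch):
            # Stack the two modalities as input channels, then score the pair.
            x = torch.cat([mr_patch, trus_patch], dim=1)
            return self.head(self.features(x))

    # Example: score a batch of four 32^3 patch pairs.
    net = SimilarityNet()
    mr = torch.randn(4, 1, 32, 32, 32)
    trus = torch.randn(4, 1, 32, 32, 32)
    print(net(mr, trus).shape)  # torch.Size([4, 1])
    ```

    In a registration loop, the TRUS patch would be resampled under the current transform estimate and the predicted score used as the objective that the composite optimizer maximizes.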

    D'Amico Risk Stratification Correlates with Degree of Suspicion of Prostate Cancer on Multiparametric Magnetic Resonance Imaging.

    PURPOSE: We determined whether there is a correlation between D'Amico risk stratification and the degree of suspicion of prostate cancer on multiparametric magnetic resonance imaging based on targeted biopsies done with our electromagnetically tracked magnetic resonance imaging/ultrasound fusion platform. MATERIALS AND METHODS: A total of 101 patients underwent 3 Tesla multiparametric magnetic resonance imaging of the prostate, consisting of T2, dynamic contrast enhanced, diffusion weighted and spectroscopy images, in cases suspicious for or with a diagnosis of prostate cancer. All prostate magnetic resonance imaging lesions were then identified and graded by the number of modalities showing suspicion on multiparametric magnetic resonance imaging as low (2 or fewer), moderate (3) or high (4). The biopsy protocol included standard 12-core biopsy, followed by real-time magnetic resonance imaging/ultrasound fusion targeted biopsies of the suspicious magnetic resonance lesions. Cases and lesions were stratified by the D'Amico risk stratification. RESULTS: In this screening population 90.1% of men had a negative digital rectal examination. Mean±SD age was 62.7±8.3 years and median prostate specific antigen was 5.8 ng/ml. Of the cases 54.5% were positive for cancer on protocol biopsy. Chi-square analysis revealed a statistically significant correlation between magnetic resonance suspicion and D'Amico risk stratification (p CONCLUSIONS: Our data support the notion that using multiparametric magnetic resonance prostate imaging one may assess the degree of risk associated with magnetic resonance visible lesions in the prostate.
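
    The correlation reported above comes from a chi-square test of association between the MRI suspicion grade and the D'Amico risk group. The sketch below shows the shape of such an analysis; the contingency counts are invented for illustration and are not data from the study.

    ```python
    # Illustrative chi-square test of association between MRI suspicion grade
    # and D'Amico risk group. The counts below are made up, not study data.
    from scipy.stats import chi2_contingency

    # Rows: MRI suspicion (low, moderate, high).
    # Columns: D'Amico risk (low, intermediate, high).
    table = [
        [20,  8,  2],
        [10, 15,  6],
        [ 3, 12, 25],
    ]
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4g}")
    ```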

    Segmenting TRUS Video Sequences Using Local Shape Statistics

    Automatic segmentation of the prostate in transrectal ultrasound (TRUS) may improve the fusion of TRUS with magnetic resonance imaging (MRI) for TRUS/MRI-guided prostate biopsy and local therapy. It is very challenging to segment the prostate in TRUS images, especially at the base and apex of the prostate, due to the large shape variation and low signal-to-noise ratio. To successfully segment the whole prostate from 2D TRUS video sequences, this paper presents a new model-based algorithm using both global population-based and adaptive local shape statistics to guide segmentation. By adaptively learning shape statistics in a local neighborhood during the segmentation process, the algorithm can effectively capture the patient-specific shape statistics and the large shape variations in the base and apex areas. After incorporating the learned shape statistics into a deformable model, the proposed method can accurately segment the entire gland of the prostate with significantly improved performance in the base and apex. The proposed method segments TRUS video in a fully automatic fashion. In our experiments, 19 video sequences with 3064 frames in total, acquired from 19 different patients undergoing prostate cancer biopsy, were used for validation. It took about 200 ms to segment one frame on a Core2 1.86 GHz PC. The average mean absolute distance (MAD) error was 1.65±0.47 mm for the proposed method, compared to 2.50±0.81 mm and 2.01±0.63 mm for independent frame segmentation and frame segmentation result propagation, respectively. Furthermore, the proposed method reduced the MAD errors by 49.4% and 18.9% in the base and by 55.6% and 17.7% in the apex, respectively.
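
    The mean absolute distance (MAD) quoted above measures how far a segmented boundary lies from the ground-truth boundary. A minimal sketch of one common symmetric formulation is shown below; this is an assumed implementation for illustration, not the paper's evaluation code.

    ```python
    # Sketch (assumed, not the paper's implementation) of a symmetric mean
    # absolute distance between two contours: for each point on one contour,
    # take the distance to the closest point on the other, average, and
    # symmetrize over both directions.
    import numpy as np

    def mean_absolute_distance(contour_a, contour_b):
        """contour_a: (N, 2) array, contour_b: (M, 2) array of boundary points in mm."""
        d = np.linalg.norm(contour_a[:, None, :] - contour_b[None, :, :], axis=-1)
        return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

    # Example: a circular ground-truth boundary and a segmentation shifted by 1.5 mm.
    theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    gt  = np.stack([20 * np.cos(theta), 20 * np.sin(theta)], axis=1)
    seg = np.stack([20 * np.cos(theta) + 1.5, 20 * np.sin(theta)], axis=1)
    print(f"MAD = {mean_absolute_distance(seg, gt):.2f} mm")
    ```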

    Discrete Deformable Model Guided by Partial Active Shape Model for TRUS Image Segmentation


    Adaptively Learning Local Shape Statistics for Prostate Segmentation in Ultrasound
